Conditional gradient algorithms for norm-regularized smooth convex optimization

Authors

  • Zaïd Harchaoui
  • Anatoli Juditsky
  • Arkadi Nemirovski
Abstract

Motivated by some applications in signal processing and machine learning, we consider two convex optimization problems where, given a cone K, a norm ‖ · ‖ and a smooth convex function f, we want either (1) to minimize the norm over the intersection of the cone and a level set of f, or (2) to minimize over the cone the sum of f and a multiple of the norm. We focus on the case where (a) the dimension of the problem is too large to allow for interior point algorithms, and (b) ‖ · ‖ is "too complicated" to allow for the computationally cheap Bregman projections required by first-order proximal gradient algorithms. On the other hand, we assume that it is relatively easy to minimize linear forms over the intersection of K and the unit ‖ · ‖-ball. Motivating examples are given by the nuclear norm, with K being either the entire space of matrices or the positive semidefinite cone in the space of symmetric matrices, and by the Total Variation norm on the space of 2D images. We discuss versions of the Conditional Gradient algorithm capable of handling our problems of interest, provide the related theoretical efficiency estimates, and outline some applications.

Research of the first and second authors was supported by the CNRS-Mastodons project GARGANTUA and the LabEx PERSYVAL-Lab (ANR-11-LABX-0025). Research of the third author was supported by ONR Grant N000140811104 and NSF Grants DMS 0914785 and CMMI 1232623.

Z. Harchaoui, LJK, Inria, 655 Avenue de l'Europe, Montbonnot, 38334 Saint-Ismier, France. E-mail: [email protected]
A. Juditsky (corresponding author), LJK, Université Grenoble Alpes, B.P. 53, 38041 Grenoble Cedex 9, France. E-mail: [email protected]
A. Nemirovski, Georgia Institute of Technology, Atlanta, GA 30332, USA. E-mail: [email protected]
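To make the linear minimization oracle concrete, here is a minimal numpy sketch (not the authors' implementation) of a plain Conditional Gradient iteration for the nuclear-norm example with K the entire matrix space: minimizing a linear form over a nuclear-norm ball only requires the top singular pair of the gradient.

```python
import numpy as np

def lmo_nuclear_ball(grad, radius):
    """Linear minimization oracle over {S : ||S||_nuc <= radius}:
    argmin <grad, S> = -radius * u1 v1^T, with (u1, v1) the top
    singular pair of grad."""
    # For large problems a Lanczos/power iteration would replace the full SVD.
    u, _, vt = np.linalg.svd(grad, full_matrices=False)
    return -radius * np.outer(u[:, 0], vt[0, :])

def conditional_gradient(grad_f, x0, radius, n_iters=200):
    """Plain Frank-Wolfe with the standard 2/(t+2) step size."""
    x = x0.copy()
    for t in range(n_iters):
        s = lmo_nuclear_ball(grad_f(x), radius)
        gamma = 2.0 / (t + 2.0)
        x = (1.0 - gamma) * x + gamma * s
    return x

# Example: nuclear-norm-constrained least squares on observed entries.
rng = np.random.default_rng(0)
m, n = 30, 20
target = rng.standard_normal((m, 5)) @ rng.standard_normal((5, n))
mask = rng.random((m, n)) < 0.5
grad = lambda x: mask * (x - target)  # gradient of 0.5*||mask*(x - target)||_F^2
x_hat = conditional_gradient(grad, np.zeros((m, n)), radius=50.0)
```

With smooth f and a compact domain, the 2/(t+2) step size gives the standard O(1/t) accuracy guarantee for this scheme.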


Related papers

Approximated Function Based Spectral Gradient Algorithm for Sparse Signal Recovery

Numerical algorithms for the l0-norm regularized non-smooth non-convex minimization problems have recently become a topic of great interest within signal processing, compressive sensing, statistics, and machine learning. Nevertheless, the l0-norm makes the problem combinatorial and generally computationally intractable. In this paper, we construct a new surrogate function to approximate the l0-norm ...
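The snippet is truncated before the paper's surrogate is stated; purely for illustration, one common smooth surrogate of the l0-norm is the per-coordinate ratio x²/(x² + ε), which tends to 1{x ≠ 0} as ε → 0 and has a closed-form gradient usable in a spectral gradient scheme:

```python
import numpy as np

def l0_surrogate(x, eps=0.1):
    """A common smooth surrogate for ||x||_0 (illustrative only; not the
    cited paper's surrogate): each coordinate contributes
    x_i^2 / (x_i^2 + eps), which -> 1{x_i != 0} as eps -> 0."""
    return np.sum(x**2 / (x**2 + eps))

def l0_surrogate_grad(x, eps=0.1):
    """Gradient of the surrogate: 2*eps*x / (x^2 + eps)^2."""
    return 2.0 * eps * x / (x**2 + eps) ** 2

x = np.array([0.0, 1e-3, 0.5, -2.0])
print(l0_surrogate(x, eps=1e-6))  # close to ||x||_0 = 3 for small eps
```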



An accelerated proximal gradient algorithm for nuclear norm regularized linear least squares problems

The affine rank minimization problem, which consists of finding a matrix of minimum rank subject to linear equality constraints, has been proposed in many areas of engineering and science. A specific rank minimization problem is the matrix completion problem, in which we wish to recover a (low-rank) data matrix from incomplete samples of its entries. A recent convex relaxation of the rank minim...
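The workhorse of such accelerated proximal gradient schemes is the proximal map of the nuclear norm, which soft-thresholds singular values. Below is a minimal FISTA-style sketch for matrix completion; it is an illustration under standard assumptions, not the cited paper's code, and the function names are made up.

```python
import numpy as np

def svt(x, tau):
    """Prox of tau*||.||_nuc: soft-threshold the singular values."""
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    return u @ np.diag(np.maximum(s - tau, 0.0)) @ vt

def apg_matrix_completion(observed, mask, mu, lip=1.0, n_iters=300):
    """Accelerated proximal gradient (FISTA-style momentum) for
    min_X 0.5*||mask*(X - observed)||_F^2 + mu*||X||_nuc."""
    x = y = np.zeros_like(observed)
    t = 1.0
    for _ in range(n_iters):
        grad = mask * (y - observed)  # gradient Lipschitz constant is 1 here
        x_next = svt(y - grad / lip, mu / lip)
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x
```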


Composite Objective Mirror Descent

We present a new method for regularized convex optimization and analyze it under both online and stochastic optimization settings. In addition to unifying previously known first-order algorithms, such as the projected gradient method, mirror descent, and forward-backward splitting, our method yields new analysis and algorithms. We also derive specific instantiations of our method for commonly use...
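For the Euclidean Bregman divergence and an l1 regularizer, one composite mirror descent update has a closed form: a gradient step followed by soft-thresholding. A one-function sketch of this particular instantiation (not the paper's general scheme):

```python
import numpy as np

def comid_step_l1(x, grad, eta, lam):
    """One composite mirror descent update with Euclidean geometry and
    r = lam*||.||_1, i.e. the minimizer of
    eta*(<grad, z> + lam*||z||_1) + 0.5*||z - x||^2,
    which is soft-thresholding of the gradient step."""
    z = x - eta * grad
    return np.sign(z) * np.maximum(np.abs(z) - eta * lam, 0.0)
```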


Duality between subgradient and conditional gradient methods

Given a convex optimization problem and its dual, there are many possible first-order algorithms. In this paper, we show the equivalence between mirror descent algorithms and algorithms generalizing the conditional gradient method. This is done through convex duality and notably implies that, for certain problems, such as supervised machine learning problems with nonsmooth losses or problems ...



Journal:
  • Math. Program.

Volume 152, Issue -

Pages -

Publication year 2015